This article provides an overview of the current state of machine learning in gravitational-wave research with interferometric detectors. Such applications are often still in their early days, but they have reached sufficient popularity to warrant an assessment of their impact across various domains, including detector studies, noise and signal simulations, and the detection and interpretation of astrophysical signals. In detector studies, machine learning could help optimize instruments such as LIGO, Virgo, KAGRA, and future detectors. Algorithms could predict and help mitigate environmental disturbances in real time, ensuring that detectors operate at peak performance. Furthermore, machine-learning tools for characterizing and cleaning data after it is taken have already become crucial for achieving the best sensitivity of the LIGO–Virgo–KAGRA network. In data analysis, machine learning has already been applied as an alternative to traditional methods for signal detection, source localization, noise reduction, and parameter estimation. For some signal types it can already yield improved efficiency and robustness, though in many other areas traditional methods remain dominant. As the field evolves, the role of machine learning in advancing gravitational-wave research is expected to become increasingly prominent. This report highlights recent advancements, challenges, and perspectives for the current detector generation, with a brief outlook to the next generation of gravitational-wave detectors.
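To make the detection use case concrete, the following is a minimal, self-contained sketch of the kind of machine-learning detection statistic this review surveys: a small one-dimensional convolutional network trained to separate toy sine-Gaussian "signals" injected into white noise from noise-only segments. Everything here (segment length, toy waveform, network architecture, training settings) is an illustrative assumption, not a method from the article; real searches operate on calibrated detector strain with far more elaborate pipelines.

```python
# Toy sketch: 1D CNN "signal vs. noise" classifier on simulated strain segments.
# All parameters are illustrative assumptions, not a published configuration.
import numpy as np
import torch
import torch.nn as nn

SEG_LEN = 1024  # samples per segment (arbitrary)

def make_batch(n):
    """Return white-noise segments; the first half get a sine-Gaussian injected."""
    x = np.random.randn(n, SEG_LEN).astype(np.float32)
    y = np.zeros(n, dtype=np.float32)
    t = np.linspace(-1.0, 1.0, SEG_LEN)
    for i in range(n // 2):
        f0 = np.random.uniform(20.0, 60.0)   # toy central frequency
        amp = np.random.uniform(0.5, 2.0)    # toy amplitude (sets the SNR)
        x[i] += amp * np.exp(-4.0 * t**2) * np.sin(2.0 * np.pi * f0 * t)
        y[i] = 1.0
    return torch.from_numpy(x).unsqueeze(1), torch.from_numpy(y)

model = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=16), nn.ReLU(), nn.MaxPool1d(4),
    nn.Conv1d(8, 16, kernel_size=8), nn.ReLU(), nn.MaxPool1d(4),
    nn.Flatten(),
    nn.LazyLinear(1),  # single logit for P(signal present)
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):                      # short toy training loop
    xb, yb = make_batch(64)
    opt.zero_grad()
    loss = loss_fn(model(xb).squeeze(1), yb)
    loss.backward()
    opt.step()

xt, yt = make_batch(256)                     # held-out toy evaluation
with torch.no_grad():
    acc = ((model(xt).squeeze(1) > 0) == (yt > 0.5)).float().mean().item()
print(f"toy detection accuracy: {acc:.2f}")
```

The network output is a per-segment detection statistic; in an actual search it would be thresholded against an empirically measured noise background rather than judged by raw classification accuracy.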
We present an enhanced method for the application of Gaussian mixture modeling (GMM) to the coherent WaveBurst (cWB) algorithm in the search for short-duration gravitational-wave (GW) transients. The supervised machine-learning method of GMM allows the multidimensional distributions of noise and signal to be modeled over a set of representative attributes, which aids in classifying GW signals against noise transients (glitches) in the data. We demonstrate that updating the approach to model construction eliminates a bias previously seen in the GMM analysis, increasing the robustness and sensitivity of the analysis over a wider range of burst source populations. The enhanced methodology is applied to the generic all-sky search for short-duration bursts in the LIGO-Virgo full third observing run (O3), marking the first application of GMM to the three-detector Livingston–Hanford–Virgo network. For both two- and three-detector networks, we observe comparable sensitivities to an array of generic signal morphologies, with significant sensitivity improvements for waveforms in the low-quality-factor parameter space at false-alarm rates of 1 per 100 years. This shows that GMM can effectively mitigate blip glitches, which are among the most problematic sources of noise for unmodeled GW searches. The cWB-GMM search recovers similar numbers of compact binary coalescence (CBC) events as other cWB postproduction methods, and reports no new gravitational-wave detections after known CBC events are removed. Published by the American Physical Society, 2024.
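To make the classification step concrete: in broad terms, the approach fits one Gaussian mixture to attribute vectors of signals and another to attribute vectors of noise transients, then ranks candidate events by the log-likelihood ratio between the two models. The sketch below is a generic illustration of that idea using scikit-learn on synthetic two-dimensional "attributes"; the feature distributions, component counts, and threshold choice are assumptions for illustration, not the cWB-GMM configuration described in the abstract.

```python
# Generic sketch of GMM-based signal/noise separation on event attributes.
# Synthetic 2D attributes stand in for cWB's multidimensional summary statistics.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy attribute distributions; a glitch-like cluster is added to the noise class.
signal_attrs = rng.normal(loc=[3.0, 2.0], scale=[0.7, 0.5], size=(2000, 2))
noise_attrs = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=[1.0, 1.0], size=(1500, 2)),
    rng.normal(loc=[1.5, -1.0], scale=[0.4, 0.6], size=(500, 2)),
])

# Fit one mixture per class; the component counts here are arbitrary.
gmm_signal = GaussianMixture(n_components=2, random_state=0).fit(signal_attrs)
gmm_noise = GaussianMixture(n_components=3, random_state=0).fit(noise_attrs)

def ranking_statistic(events):
    """Log-likelihood ratio: higher values favor the signal model."""
    return gmm_signal.score_samples(events) - gmm_noise.score_samples(events)

# Score held-out samples and set a threshold from the noise background,
# loosely analogous to fixing a false-alarm rate.
test_signal = rng.normal(loc=[3.0, 2.0], scale=[0.7, 0.5], size=(500, 2))
test_noise = rng.normal(loc=[0.0, 0.0], scale=[1.0, 1.0], size=(500, 2))
threshold = np.quantile(ranking_statistic(test_noise), 0.999)  # 0.1% noise leakage
efficiency = (ranking_statistic(test_signal) > threshold).mean()
print(f"detection efficiency at fixed noise threshold: {efficiency:.2f}")
```

The same likelihood-ratio logic extends directly to higher-dimensional attribute spaces, which is where modeling the joint distribution with a mixture pays off over independent per-attribute cuts.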